  • Ben McKean 272 posts 549 karma points
    Jan 25, 2012 @ 15:50

    Robots.txt still showing http://{HTTP_HOST}/sitemap

    Hi

    I've installed this package and everything seems to have installed correctly, but when I browse to my site the contents of robots.txt is still:

    User-agent: *
    Disallow: /
    Sitemap: http://{HTTP_HOST}/sitemap

    The DLLs are definitely in the bin directory and web.config looks as it should to me, but the replacement of {HTTP_HOST} doesn't seem to be happening.
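
    For context: the package presumably serves robots.txt through an ASP.NET IHttpHandler that swaps the {HTTP_HOST} token for the host name of the incoming request. A minimal sketch of that idea (the class name and template path are assumptions, not the package's actual code):

    using System.IO;
    using System.Web;

    // Illustrative sketch only -- not the package's actual source.
    public class RobotsTxtHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            // Read the robots.txt template from the site root
            // (assumed location).
            string template = File.ReadAllText(
                context.Server.MapPath("~/robots.txt"));

            // Replace the token with the host of the current request.
            string output = template.Replace(
                "{HTTP_HOST}", context.Request.Url.Host);

            context.Response.ContentType = "text/plain";
            context.Response.Write(output);
        }
    }

    If a handler like this never runs, the static file is served verbatim and the token is left in place, which matches the output above.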

    Whilst testing I've noticed that changes I make to web.config don't seem to trigger a recompile when I refresh robots.txt, whereas if I make a change to web.config and then view a normal page within the site, it takes the usual 10 seconds to reload.

    Am I missing something else? Anything else I need to do in IIS?

    Thanks

    Ben


  • Sebastiaan Janssen 5060 posts 15522 karma points MVP admin hq
    Jan 25, 2012 @ 15:53
  • Ben McKean 272 posts 549 karma points
    Jan 25, 2012 @ 16:19

    Hi

    Thanks for the quick reply.

    I did read through that post, but I think it may have been something else.

    I was testing it in IIS6; I quickly tried it in IIS7 and it works fine, so it must've been something to do with that. Luckily our production server is using IIS7, so that's fine.
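
    For anyone else hitting this: under IIS6 (and IIS7 in classic mode), .txt requests are normally served by the static file handler and never reach ASP.NET, so a managed handler registered in web.config is bypassed unless a script mapping for robots.txt (or a wildcard map) is added in IIS Manager. IIS7 in integrated mode routes all requests through the managed pipeline, which would explain the behaviour above. A hedged sketch of the two web.config registrations (the type name is an assumption):

    <!-- IIS6 / IIS7 classic mode: system.web section.
         IIS6 also needs a script map so the request reaches ASP.NET. -->
    <system.web>
      <httpHandlers>
        <add verb="GET" path="robots.txt" type="RobotsTxtHandler" />
      </httpHandlers>
    </system.web>

    <!-- IIS7 integrated mode: system.webServer section. -->
    <system.webServer>
      <handlers>
        <add name="RobotsTxt" verb="GET" path="robots.txt"
             type="RobotsTxtHandler" />
      </handlers>
    </system.webServer>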

    Thanks

    Ben
